An Asymptotical Variational Principle Associated with the Steepest Descent Method for a Convex Function

Author

  • B. Lemaire
Abstract

The asymptotical limit of the trajectory defined by the continuous steepest descent method for a proper closed convex function f on a Hilbert space is characterized in the set of minimizers of f via an asymptotical variational principle of Brezis-Ekeland type. The implicit discrete analogue (prox method) is also considered.
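The implicit discrete analogue mentioned in the abstract, the prox (proximal point) method, can be illustrated on a one-dimensional quadratic, where the prox step has a closed form. The test function, step parameter, and iteration count below are illustrative choices, not taken from the paper.

```python
def prox_quadratic(x, a, lam):
    # Closed-form proximal step for f(y) = 0.5 * (y - a)**2:
    #   argmin_y  f(y) + (1 / (2*lam)) * (y - x)**2  =  (x + lam*a) / (1 + lam)
    return (x + lam * a) / (1.0 + lam)

def prox_method(x0, a, lam=1.0, iters=50):
    """Implicit discrete steepest descent: repeatedly apply the prox operator."""
    x = x0
    trajectory = [x]
    for _ in range(iters):
        x = prox_quadratic(x, a, lam)
        trajectory.append(x)
    return trajectory

print(prox_method(5.0, 1.0)[-1])  # approaches the minimizer a = 1.0
```

For this quadratic the distance to the minimizer shrinks by a factor 1/(1 + lam) per step, so the iterates converge to the unique minimizer for any fixed lam > 0.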


Similar articles

Hybrid steepest-descent method with sequential and functional errors in Banach space

Let $X$ be a reflexive Banach space, $T:X\to X$ be a nonexpansive mapping with $C=\mathrm{Fix}(T)\neq\emptyset$, and $F:X\to X$ be $\delta$-strongly accretive and $\lambda$-strictly pseudocontractive with $\delta+\lambda>1$. In this paper, we present modified hybrid steepest-descent methods, involving sequential errors and functional errors with functions admitting a center, which generate convergent sequences ...

Full text

$(\varphi_1, \varphi_2)$-variational principle

In this paper we prove that if $X$ is a Banach space, then for every lower semi-continuous bounded-below function $f$, there exists a $\left(\varphi_1, \varphi_2\right)$-convex function $g$, with arbitrarily small norm, such that $f + g$ attains its strong minimum on $X$. This result extends some of the well-known variational principles such as that of Ekeland [On the variational principle, J. Ma...

Full text

A Free Line Search Steepest Descent Method for Solving Unconstrained Optimization Problems

In this paper, we solve an unconstrained optimization problem using a free line search steepest descent method. First, we propose a double-parameter scaled quasi-Newton formula for calculating an approximation of the Hessian matrix. The approximation obtained from this formula is a positive definite matrix that satisfies the standard secant relation. We also show that the largest eigenvalue...

Full text

Strong and Weak Convergence Theorems of Implicit Hybrid Steepest-descent Methods for Variational Inequalities

Assume that F is a nonlinear operator on a real Hilbert space H which is strongly monotone and Lipschitzian with constants η > 0 and κ > 0, respectively on a nonempty closed convex subset C of H . Assume also that C is the intersection of the fixed point sets of a finite number of nonexpansive mappings on H . We develop an implicit hybrid steepest-descent method which generates an iterative seq...

Full text

Efficient implementation of a modified and relaxed hybrid steepest-descent method for a type of variational inequality

To reduce the difficulty and complexity in computing the projection from a real Hilbert space onto a nonempty closed convex subset, researchers have provided a hybrid steepest-descent method for solving VI(F,K) and a subsequent three-step relaxed version of this method. In a previous study, the latter was used to develop a modified and relaxed hybrid steepest-descent (MRHSD) method. However, ch...

Full text


Journal:

Volume   Issue

Pages  -

Publication year: 1996